Overview of the NTCIR-5 Cross-Lingual Question Answering Task (CLQA1)

Authors

  • Yutaka Sasaki
  • Hsin-Hsi Chen
  • Kuang-hua Chen
  • Chuan-Jie Lin
Abstract

This paper gives an overview of the NTCIR-5 Cross-Lingual Question Answering Task (CLQA1), an evaluation campaign for cross-lingual question answering technology. The evaluation was carried out in June 2005. In CLQA1, we aimed to promote research on cross-lingual question answering technology, mainly for East Asian languages. As a first attempt, we conducted evaluations of five subtasks: JE, EJ, CE, CC, and EC, where C, E, and J stand for Chinese, English, and Japanese, respectively, and a code XY indicates that questions are given in language X and answers are extracted from documents written in language Y. For system development, we provided 200-300 sample question/answer pairs for each subtask. The Formal Run evaluation was conducted during June 13-27, 2005 with 200 test questions. In total, 13 research institutes worldwide participated in CLQA1, and 89 runs were submitted.
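
As an illustration of the subtask naming convention described in the abstract, here is a minimal Python sketch; the language-code mapping and the helper function are hypothetical illustrations added here, not part of the CLQA1 specification.

# A two-letter subtask code XY means: questions are posed in language X,
# and answers are extracted from documents written in language Y.
LANGUAGES = {"C": "Chinese", "E": "English", "J": "Japanese"}

def describe_subtask(code: str) -> str:
    """Expand a CLQA1 subtask code such as 'JE' into a readable description."""
    source, target = code[0], code[1]
    return (f"{code}: questions in {LANGUAGES[source]}, "
            f"answers from {LANGUAGES[target]} documents")

for subtask in ("JE", "EJ", "CE", "CC", "EC"):
    print(describe_subtask(subtask))

For example, the code "JE" expands to "questions in Japanese, answers from English documents".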

Similar articles

Overview of the NTCIR-6 Cross-Lingual Question Answering (CLQA) Task

This paper gives an overview of the NTCIR-6 Cross-Lingual Question Answering (CLQA) Task, an evaluation campaign for cross-lingual question answering technology. In NTCIR-5, the first CLQA task, targeting the Chinese, English, and Japanese languages, was carried out. Following the success of NTCIR-5 CLQA, NTCIR-6 hosted the second campaign on the CLQA task. Since the handling of Named Entities is...

System Description of NTOUA Group in CLQA1

CLQA1 is the first large-scale evaluation of Chinese question answering. Our group participated in the C-E subtask. We augmented our monolingual Chinese QA system to handle cross-lingual QA. A bilingual dictionary and online web search engines were used for translation. In the end, six runs were submitted, and the best run provided correct answers for 8 of the 200 questions at top 1 and 22 of...

Overview of the NTCIR-7 ACLIA IR4QA Task

This paper presents an overview of the IR4QA (Information Retrieval for Question Answering) Task of the NTCIR-7 ACLIA (Advanced Cross-lingual Information Access) Task Cluster. IR4QA evaluates traditional ranked retrieval of documents using well-studied metrics such as Average Precision, but the retrieval task is embedded in the context of cross-lingual question answering. That is, document retri...
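
For reference, the Average Precision metric mentioned above is conventionally defined as follows (notation supplied here for illustration, not quoted from the IR4QA overview):

\mathrm{AP} = \frac{1}{R} \sum_{k=1}^{n} P(k)\,\mathrm{rel}(k)

where R is the number of relevant documents, n is the number of retrieved documents, P(k) is the precision of the top k results, and rel(k) is 1 if the document at rank k is relevant and 0 otherwise.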

NiCT/ATR in NTCIR-7 CCLQA Track: Answering Complex Cross-lingual Questions

This paper describes our complex cross-lingual question answering (CCLQA) system for the NTCIR-7 ACLIA track. To answer complex questions such as events, biographies, definitions, and relations, we designed two models, i.e., the centroid-vector model and the SVM-based model. In the official evaluation of the NTCIR-7 CCLQA track, our SVM-based model achieved a 22.11% F-score in the English-Chinese cross...
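
For context, the F-score reported above is the standard harmonic mean of precision P and recall R (a well-known definition given here for reference, not quoted from the paper):

F = \frac{2PR}{P + R}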

Overview of the Eighth NTCIR Workshop

For the Eighth NTCIR Workshop (NTCIR-8), we selected and organized seven research areas as “tasks” to investigate, test and benchmark newly constructed test collections. These areas are Complex Cross-Lingual Question Answering (CCLQA), Information Retrieval for Question Answering (IR4QA), Geographic and Temporal Search (GeoTime), Multilingual Opinion Analysis (MOAT), Patent Machine Translation ...

Journal title:

Volume   Issue

Pages  -

Publication date: 2005